High capacity associative memories and connection constraints

Authors

  • Neil Davey
  • Rod Adams
Abstract

High-capacity associative neural networks can be built from networks of perceptrons trained with simple perceptron learning. Such networks perform much better than those trained with the standard Hopfield one-shot Hebbian rule. An experimental investigation is reported into how such networks perform when the connection weights are not free to take any value. The three restrictions investigated are a symmetry constraint, a sign constraint, and a dilution constraint. The selection of these constraints is motivated by both engineering and biological considerations.
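
The scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: each unit is trained as a perceptron so that every stored bipolar pattern becomes a fixed point of the network, and the symmetry constraint is modelled here by simply mirroring each weight update; the function name and all parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_perceptron_am(patterns, symmetric=True, epochs=100, lr=0.1):
    """Train each unit as a perceptron so that every stored pattern
    is a fixed point of the network.  With `symmetric=True` every
    update is mirrored, keeping the weight matrix symmetric."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for _ in range(epochs):
        errors = 0
        for x in patterns:
            h = W @ x                      # local fields for this pattern
            for i in range(n):
                if x[i] * h[i] <= 0.0:     # unit i fails on its own bit
                    errors += 1
                    delta = lr * x[i] * x
                    delta[i] = 0.0         # no self-connection
                    W[i] += delta
                    if symmetric:
                        W[:, i] += delta   # mirrored (symmetric) update
        if errors == 0:                    # all patterns are fixed points
            break
    return W

# Store a few random bipolar patterns and check that they are stable.
patterns = rng.choice([-1.0, 1.0], size=(3, 20))
W = train_perceptron_am(patterns)
stable = all(np.array_equal(np.sign(W @ x), x) for x in patterns)
```

With `symmetric=False` this reduces to independent perceptron training of each row, the unconstrained high-capacity scheme; the mirrored variant trades some capacity for a symmetric weight matrix.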


Related articles

The capacity of the Hopfield associative memory

Techniques from coding theory are applied to study rigorously the capacity of the Hopfield associative memory. Such a memory stores n-tuples of ±1's. The components change depending on a hard-limited version of linear functions of all other components. With symmetric connections between components, a stable state is ultimately reached. By building up the connection matrix as a sum-of-outer prod...
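
The one-shot sum-of-outer-products (Hebbian) construction referred to above can be sketched in a few lines; a minimal illustration assuming bipolar patterns and a single synchronous hard-limited update for recall:

```python
import numpy as np

rng = np.random.default_rng(1)

# One-shot Hebbian storage: the connection matrix is the sum of the
# outer products of the stored patterns, with self-connections zeroed.
patterns = rng.choice([-1.0, 1.0], size=(2, 50))
W = sum(np.outer(x, x) for x in patterns)
np.fill_diagonal(W, 0.0)

# Recall: start from a corrupted probe and apply one synchronous
# hard-limited (sign) update of every component.
probe = patterns[0].copy()
probe[:3] *= -1.0                 # flip three bits
recalled = np.sign(W @ probe)
```

At this very low loading the corrupted probe falls straight back into the stored pattern; it is this construction whose capacity the paper above analyses rigorously.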


Generalized Asymmetrical Bidirectional Associative Memory

A classical bidirectional associative memory (BAM) suffers from low storage capacity and an abundance of spurious memories, though it has good generalization and noise immunity. In this paper, the Hamming distance in the recall procedure of the usual asymmetrical BAM is replaced with a modified Hamming distance by introducing a weighting matrix into the connection matrix. This generalization is valid...


A new synthesis approach for feedback neural networks based on the perceptron training algorithm

In this paper, a new synthesis approach is developed for associative memories based on the perceptron training algorithm. The design (synthesis) problem of feedback neural networks for associative memories is formulated as a set of linear inequalities such that the use of perceptron training is evident. The perceptron training in the synthesis algorithms is guaranteed to converge for the design...



Connectivity in Real and Simulated Associative Memories

Finding efficient patterns of connectivity in sparse associative memories is a difficult problem. It is, however, one that real neuronal networks, such as the mammalian cortex, must have solved. We have investigated computational models of sparsely connected associative memories and found that some patterns of connectivity produce both good performance and efficient use of resources. This could...
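
The diluted connectivity discussed here can be illustrated with a small sketch. This assumes simple random symmetric dilution of a one-shot Hebbian matrix, not the specific connectivity patterns the paper investigates; the function name and parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def diluted_hopfield(patterns, p):
    """One-shot Hebbian storage followed by random symmetric dilution:
    only a fraction `p` of the possible connections is kept."""
    n = patterns.shape[1]
    W = sum(np.outer(x, x) for x in patterns)
    np.fill_diagonal(W, 0.0)
    keep = rng.random((n, n)) < p
    keep = np.triu(keep, 1)
    keep = keep | keep.T          # symmetric connectivity mask
    return W * keep

patterns = rng.choice([-1.0, 1.0], size=(2, 100))
W = diluted_hopfield(patterns, p=0.5)

# Even with half the connections removed, a stored pattern remains
# (almost) a fixed point of one synchronous update at this low loading.
recalled = np.sign(W @ patterns[0])
overlap = np.mean(recalled == patterns[0])
```

Random dilution is only the simplest baseline; the point of the paper above is that some non-random connectivity patterns do markedly better for the same number of connections.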




Journal title:
  • Connect. Sci.

Volume 16, Issue -

Pages -

Publication date: 2004